
    Advanced simulation methods for stochastic finite elements and structural reliability

    This thesis presents a series of methodologies that have been implemented in the framework of stochastic finite element methods (SFEM) and reliability analysis in order to reduce the computational effort involved. The first methodology is a neural network-based subset simulation, in which neural networks are trained and then used as robust meta-models to increase the efficiency of subset simulation at minimal additional computational cost. In the second methodology, neural networks are used in the framework of Monte Carlo simulation (MCS) for computing the reliability of stochastic structural systems, by providing robust neural network estimates of the structural response. The third methodology constructs an adaptive sparse polynomial chaos (PC) expansion of the response of stochastic systems in the framework of the spectral stochastic finite element method (SSFEM). It utilizes the concept of the variability response function (VRF) to compute an a priori, low-cost estimate of the spatial distribution of the second-order error of the response as a function of the number of terms used in the truncated Karhunen-Loève series representation of the random field involved in the problem; this reduces the number of polynomial-basis coefficients that must be computed and hence the density of the augmented matrices of the method, which, in combination with iterative solvers, improves the computational performance of the method. Finally, a parametric study of Monte Carlo simulation versus SSFEM in large-scale systems is performed.
    Dimitris G. Giovanis
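
    The second methodology, using a trained neural network as an inexpensive meta-model inside Monte Carlo simulation, can be sketched in a few lines. Below is a minimal illustration, assuming a hypothetical limit-state function g(x) in place of the expensive stochastic finite element model; failure is defined as g(x) < 0.

    ```python
    # Minimal sketch of surrogate-based Monte Carlo reliability estimation.
    # The limit-state function g and all sample sizes are hypothetical
    # stand-ins for the expensive stochastic finite element model.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    def g(x):
        # Hypothetical limit state: failure when g(x) < 0.
        return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

    # 1. Evaluate the expensive model on a small design of experiments.
    X_train = rng.standard_normal((200, 2))
    y_train = g(X_train)

    # 2. Train a neural network meta-model on the few full-model runs.
    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                             random_state=0).fit(X_train, y_train)

    # 3. Run a large Monte Carlo simulation on the cheap surrogate.
    X_mc = rng.standard_normal((1_000_000, 2))
    pf = np.mean(surrogate.predict(X_mc) < 0.0)
    print(f"Estimated probability of failure: {pf:.3e}")
    ```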

    Machine Learning for the identification of phase-transitions in interacting agent-based systems

    Deriving closed-form, analytical expressions for reduced-order models, and judiciously choosing the closures leading to them, has long been the strategy of choice for studying phase- and noise-induced transitions for agent-based models (ABMs). In this paper, we propose a data-driven framework that pinpoints phase transitions for an ABM in its mean-field limit, using a smaller number of variables than traditional closed-form models. To this end, we use the manifold learning algorithm Diffusion Maps to identify a parsimonious set of data-driven latent variables, and show that they are in one-to-one correspondence with the expected theoretical order parameter of the ABM. We then utilize a deep learning framework to obtain a conformal reparametrization of the data-driven coordinates that facilitates, in our example, the identification of a single parameter-dependent ODE in these coordinates. We identify this ODE through a residual neural network inspired by a numerical integration scheme (forward Euler). We then use the identified ODE, together with an odd symmetry transformation, to construct the bifurcation diagram exhibiting the phase transition.
    Comment: 14 pages, 9 figures
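
    The forward-Euler-inspired residual network can be illustrated compactly. The sketch below is not the authors' code: synthetic snapshot pairs from the pitchfork normal form du/dt = eps*u - u^3 stand in for the ABM's data-driven latent coordinate, and all architecture choices are illustrative.

    ```python
    # Sketch of a forward-Euler residual network identifying a
    # parameter-dependent ODE du/dt = f_theta(u, eps) from snapshot pairs.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    dt = 0.01

    # Hypothetical training data: one Euler step of du/dt = eps*u - u**3.
    eps = torch.empty(4096, 1).uniform_(-1.0, 1.0)
    u_n = torch.empty(4096, 1).uniform_(-1.5, 1.5)
    u_next = u_n + dt * (eps * u_n - u_n ** 3)

    f_theta = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                            nn.Linear(64, 64), nn.Tanh(),
                            nn.Linear(64, 1))
    opt = torch.optim.Adam(f_theta.parameters(), lr=1e-3)

    for step in range(2000):
        opt.zero_grad()
        # Residual (forward Euler) structure: u_{n+1} = u_n + dt * f_theta.
        pred = u_n + dt * f_theta(torch.cat([u_n, eps], dim=1))
        loss = nn.functional.mse_loss(pred, u_next)
        loss.backward()
        opt.step()

    # Tracing the zeros of f_theta(u, eps) over eps then yields the
    # bifurcation diagram that exhibits the phase transition.
    ```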

    High-dimensional interpolation on the Grassmann manifold using Gaussian processes

    This paper proposes a novel method for performing interpolation of high-dimensional systems. The proposed method projects the high-dimensional full-field solution into a lower-dimensional space where interpolation is computationally more tractable. The method employs spectral clustering, a class of machine learning techniques that uses the eigen-structure of a similarity matrix to partition data into disjoint clusters based on the similarity of the points, in order to identify areas of the parameter space where sharp changes of the solution field occur. To do this, we derive a similarity matrix based on the pairwise distances between the high-dimensional solutions of the stochastic system projected onto the Grassmann manifold. The distances are calculated using appropriately defined metrics, and the similarity matrix is used to cluster the data based on their similarity. Points that belong to the same cluster are projected onto the tangent space (an inner-product flat space) defined at the Karcher mean of these points, and a Gaussian process is used for interpolation on the tangent space of the Grassmann manifold in order to predict the solution without requiring full model evaluations.
    Methodological developments presented herein have been supported by the Office of Naval Research.
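
    A compact sketch of the tangent-space interpolation step is given below, assuming stand-in snapshot matrices and, for brevity, anchoring the tangent space at the first sample rather than at the Karcher mean; the log/exp maps are the standard Grassmann formulas.

    ```python
    # Sketch: Gaussian-process interpolation on the tangent space of the
    # Grassmann manifold. Snapshot data and sizes are illustrative only.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    n, r = 50, 3                      # ambient dimension, subspace rank

    def subspace(theta):
        # Stand-in for an expensive parameter-dependent full-field solve.
        t = np.arange(n)
        A = np.column_stack([np.sin((j + 1) * theta * t) for j in range(r)])
        return np.linalg.qr(A)[0]

    def log_map(U0, U):
        # Grassmann logarithm of U at base point U0.
        M = (U - U0 @ (U0.T @ U)) @ np.linalg.inv(U0.T @ U)
        Q, s, Vt = np.linalg.svd(M, full_matrices=False)
        return Q @ np.diag(np.arctan(s)) @ Vt

    def exp_map(U0, G):
        # Grassmann exponential of tangent vector G at base point U0.
        Q, s, Vt = np.linalg.svd(G, full_matrices=False)
        return U0 @ Vt.T @ np.diag(np.cos(s)) @ Vt + Q @ np.diag(np.sin(s)) @ Vt

    thetas = np.linspace(0.1, 1.0, 8)
    bases = [subspace(t) for t in thetas]
    U0 = bases[0]

    # Map samples to the flat tangent space and fit one GP per entry.
    Gammas = np.array([log_map(U0, U).ravel() for U in bases])
    gp = GaussianProcessRegressor().fit(thetas[:, None], Gammas)

    # Predict the subspace at a new parameter value, no full solve needed.
    G_new = gp.predict(np.array([[0.55]]))[0].reshape(n, r)
    U_new = exp_map(U0, G_new)
    print("orthonormality error:", np.linalg.norm(U_new.T @ U_new - np.eye(r)))
    ```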

    Structural reliability analysis from sparse data

    Over the past several decades, major advances have been made in probabilistic methods for assessing structural reliability, a critical feature of these methods being that the probability models of the random variables are known precisely. However, when data are scant it is rarely possible to identify a unique probability distribution that fits the data, a fact that introduces uncertainty into the estimation of the probability of failure, since the location of the limit surface in the probability space is also uncertain. The objective of the proposed work is to realistically assess the uncertainty in the probability-of-failure estimates of the First Order Reliability Method (FORM) resulting from the limited amount of data.
    Methodological developments presented herein have been supported by the Office of Naval Research, with Dr. Paul Hess as program officer.
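
    To make FORM's role concrete: it locates the most probable failure point in standard normal space and estimates the probability of failure as pf = Phi(-beta), where beta is the Hasofer-Lind reliability index. Below is a minimal sketch with a hypothetical limit-state function; when the fitted distribution (and hence the mapping to standard normal space) is uncertain, beta and pf inherit that uncertainty.

    ```python
    # Minimal FORM sketch: Hasofer-Lind index via constrained optimization
    # in standard normal space. The limit-state g is a hypothetical example.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def g(u):
        # Failure domain: g(u) < 0.
        return 4.0 - u[0] - 0.5 * u[1] ** 2

    # Most probable failure point: minimize ||u||^2 subject to g(u) = 0.
    res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
                   constraints={"type": "eq", "fun": g})
    beta = np.linalg.norm(res.x)
    print(f"beta = {beta:.3f}, FORM pf = {norm.cdf(-beta):.3e}")
    ```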

    Integrating Multiple Sources of Knowledge for the Intelligent Detection of Anomalous Sensory Data in a Mobile Robot

    For service robots to spread into everyday scenarios, they must be able to identify and manage abnormal situations intelligently. In this paper we work at the basic sensor level, dealing with raw data produced by diverse devices subjected to negative circumstances such as adverse environmental conditions or difficult-to-perceive objects. We have implemented a probabilistic Bayesian inference process for deducing whether the sensors are working nominally or not, which abnormal situation occurs, and even for correcting their data. Our inference system works by integrating, in a rigorous and homogeneous mathematical framework, multiple sources and modalities of knowledge: human expert, external information systems, application-specific and temporal. The results on a real service robot navigating in a structured mixed indoor-outdoor environment demonstrate good detection capabilities and set a promising basis for improving robustness and safety in many common service tasks.
    Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech.
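
    At its core, such an inference can be reduced to a recursive Bayesian update over a set of sensor health states. The toy sketch below is only illustrative: the three states, the prior, and the Gaussian likelihoods are placeholders, not the paper's actual knowledge sources.

    ```python
    # Toy recursive Bayesian inference over sensor health states.
    import numpy as np
    from scipy.stats import norm

    states = ["nominal", "biased", "saturated"]
    prior = np.array([0.90, 0.07, 0.03])   # e.g., encoding expert knowledge

    def likelihood(z):
        # Plausibility of a range reading z (metres) under each state.
        return np.array([norm.pdf(z, loc=2.0, scale=0.1),   # nominal
                         norm.pdf(z, loc=2.5, scale=0.1),   # constant bias
                         norm.pdf(z, loc=8.0, scale=0.5)])  # stuck at max range

    posterior = prior.copy()
    for z in [2.4, 2.6, 2.5]:              # incoming raw readings
        posterior = posterior * likelihood(z)   # Bayes rule, unnormalized
        posterior /= posterior.sum()

    for s, p in zip(states, posterior):
        print(f"P({s} | data) = {p:.3f}")
    ```

    A posterior concentrated on "biased" would both flag the abnormal situation and suggest a correction, namely subtracting the estimated bias from the raw data.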

    A multi-center, real-life experience on liquid biopsy practice for EGFR testing in non-small cell lung cancer (NSCLC) patients

    Background: Circulating tumor DNA (ctDNA) is a source of tumor genetic material for EGFR testing in NSCLC. Real-world data about liquid biopsy (LB) clinical practice are lacking. The aim of the study was to describe the LB practice for EGFR detection in North Eastern Italy. Methods: We conducted a multi-regional survey on ctDNA testing practices in lung cancer patients. Results: Median time from blood collection to plasma separation was 50 min (range 20–120 min), median time from plasma extraction to ctDNA analysis was 24 h (30 min–5 days), and median turnaround time was 24 h (6 h–5 days). Four hundred and seventy-five patients and 654 samples were tested. One hundred and ninety-two patients were tested at diagnosis, with a 16% EGFR mutation rate. Among the 283 patients tested at disease progression, 35% were T790M-positive. The main differences in LB results between 2017 and 2018 were the number of LBs performed per patient at disease progression (2.88 vs. 1.2, respectively) and the percentage of T790M-positive patients (61% vs. 26%).

    Masonry compressive strength prediction using artificial neural networks

    Masonry is not only among the oldest building materials but also one of the most widely used, owing to its simple construction and low cost compared with other modern building materials. Nevertheless, there is not yet a robust quantitative method available in the literature that can reliably predict its strength based on the geometrical and mechanical characteristics of its components. This limitation is due to the highly nonlinear relation between the compressive strength of masonry and the geometrical and mechanical properties of its components. In this paper, the application of artificial neural networks for predicting the compressive strength of masonry is investigated. Specifically, back-propagation neural network models are used to predict the compressive strength of masonry prisms based on experimental data available in the literature. The comparison of the derived results with the experimental findings demonstrates the ability of artificial neural networks to approximate the compressive strength of masonry walls in a reliable and robust manner.
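
    As an illustration of the kind of model the paper describes, the sketch below trains a small back-propagation network on synthetic data. The three input features and the Eurocode-6-like target formula are assumptions chosen for the example, not the paper's actual experimental database.

    ```python
    # Illustrative back-propagation network for masonry prism strength.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Hypothetical features: unit strength f_b, mortar strength f_m (MPa),
    # and unit-height-to-joint-thickness ratio.
    X = rng.uniform([5.0, 1.0, 2.0], [50.0, 20.0, 15.0], size=(300, 3))
    # Synthetic target loosely shaped like Eurocode 6: f = K * f_b^0.7 * f_m^0.3.
    y = 0.55 * X[:, 0] ** 0.7 * X[:, 1] ** 0.3 + rng.normal(0.0, 0.5, 300)

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 16),
                                       max_iter=5000, random_state=0))
    model.fit(X[:250], y[:250])
    print("held-out R^2:", model.score(X[250:], y[250:]))
    ```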